Data preprocessing method in software defect prediction
PAN Chunxia, YANG Qiuhui, TAN Wukun, DENG Huixin, WU Jia
Journal of Computer Applications    2020, 40 (11): 3273-3279.   DOI: 10.11772/j.issn.1001-9081.2020040464
Software defect prediction is a hot research topic in software quality assurance, and the quality of a defect prediction model is closely related to its training data. Datasets used for defect prediction mainly suffer from two problems: feature selection and class imbalance. For feature selection, common software-development process features and newly proposed extended process features were used, and a feature selection algorithm based on clustering analysis was applied. For class imbalance, an improved Borderline-SMOTE (Borderline-Synthetic Minority Oversampling Technique) method was proposed to balance the numbers of positive and negative samples in the training set and to make the synthesized samples more consistent with real sample characteristics. Experiments were performed on open source datasets from projects such as bugzilla and jUnit. The results show that the feature selection algorithm reduces model training time by 57.94% while keeping a high F-measure; compared with the model trained on samples processed by the original method, the model obtained with the improved Borderline-SMOTE method increases Precision, Recall, F-measure and AUC (Area Under the Curve) by 2.36, 1.8, 2.13 and 2.36 percentage points on average, respectively; introducing the extended process features improves the F-measure by 3.79% on average; and compared with models obtained by methods in the literature, the proposed model improves the F-measure by 15.79% on average. The experimental results prove that the proposed method can effectively improve the quality of defect prediction models.
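The borderline oversampling idea described above can be sketched as follows. This is a minimal, generic Borderline-SMOTE in NumPy, not the paper's improved variant; the neighbour count `k`, the per-point synthesis count `n_per_point`, and the "danger point" rule are standard textbook choices, not taken from the paper.

```python
import numpy as np

def borderline_smote(X_min, X_maj, k=5, n_per_point=5, seed=0):
    """Sketch of Borderline-SMOTE: only minority samples near the class
    border ("danger" points) are used to synthesize new samples."""
    rng = np.random.default_rng(seed)
    X_all = np.vstack([X_min, X_maj])
    n_min = len(X_min)
    synthetic = []
    for i, x in enumerate(X_min):
        d = np.linalg.norm(X_all - x, axis=1)
        nn = np.argsort(d)[1:k + 1]          # k nearest neighbours, self excluded
        n_maj = np.sum(nn >= n_min)          # majority-class neighbours
        if k / 2 <= n_maj < k:               # borderline ("danger") point; all-majority = noise
            min_nn = [j for j in nn if j < n_min]
            for _ in range(n_per_point):
                j = int(rng.choice(min_nn))  # interpolate towards a minority neighbour
                gap = rng.random()
                synthetic.append(x + gap * (X_min[j] - x))
    return np.array(synthetic) if synthetic else np.empty((0, X_min.shape[1]))
```

Only the danger points spawn synthetic samples, which concentrates the oversampling along the decision border rather than inside safe minority regions.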
Urban traffic signal control based on deep reinforcement learning
SHU Lingzhou, WU Jia, WANG Chen
Journal of Computer Applications    2019, 39 (5): 1495-1499.   DOI: 10.11772/j.issn.1001-9081.2018092015
To meet the adaptivity and robustness requirements of urban traffic signal control optimization, a traffic signal control algorithm based on Deep Reinforcement Learning (DRL) was proposed, in which the whole regional traffic was controlled by a control Agent constructed from a deep learning network. Firstly, the Agent predicted the best control strategy for the current state by continuously observing the state of the traffic environment, abstracted as a position matrix and a speed matrix; this matrix representation effectively captures vital information and reduces redundant information about the traffic environment. Then, based on the impact of the selected strategy on the traffic environment, a reinforcement learning algorithm was employed to constantly correct the intrinsic parameters of the Agent in order to maximize the global speed over a period of time. Finally, after several iterations, the Agent learned how to control the traffic effectively. Experiments in the traffic simulation software Vissim show that compared with other DRL-based algorithms, the proposed algorithm is superior in average global speed, average queue length and stability; the average global speed increases by 9% and the average queue length decreases by 13.4% compared with the baseline. The experimental results verify that the proposed algorithm can adapt to a complex and dynamically changing traffic environment.
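The position/speed matrix abstraction described above can be illustrated with a small sketch; the lane/cell grid sizes and the `(lane, position, speed)` vehicle tuples are assumptions, not the paper's exact encoding.

```python
import numpy as np

def traffic_state(vehicles, cell_len=5.0, lanes=4, cells=20):
    """Sketch of a matrix state abstraction for traffic DRL: a binary
    position matrix and a speed matrix over a lane/cell grid.
    vehicles: iterable of (lane_index, position_m, speed_mps) tuples."""
    pos = np.zeros((lanes, cells))
    spd = np.zeros((lanes, cells))
    for lane, x, v in vehicles:
        c = int(x // cell_len)
        if 0 <= lane < lanes and 0 <= c < cells:
            pos[lane, c] = 1.0     # cell occupied
            spd[lane, c] = v       # speed of the occupying vehicle
    return pos, spd
```

The two matrices together form a compact image-like state that a deep network can consume directly, which is the "abstract vital information, drop redundancy" point made in the abstract.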
Review of human activity recognition based on wearable sensors
ZHENG Zengwei, DU Junjie, HUO Meimei, WU Jianzhong
Journal of Computer Applications    2018, 38 (5): 1223-1229.   DOI: 10.11772/j.issn.1001-9081.2017112715
Human Activity Recognition (HAR) has a wide range of applications in medical care, safety and entertainment. With the development of the sensor industry, sensors that can accurately collect human activity data have been widely deployed on wearable equipment such as wristbands, watches and mobile phones. Compared with behavior recognition based on video images, sensor-based behavior recognition has the advantages of low cost, flexibility and portability, so HAR research based on wearable sensors has become an important research field. The data collection, feature extraction, feature selection and classification methods of HAR were described in detail, and the techniques commonly used in each step were analyzed. Finally, the main open problems of HAR and its development directions were pointed out.
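A typical feature-extraction step in such pipelines looks like the following sketch; the window length, step size and the particular statistics are common defaults in the HAR literature, not taken from any specific surveyed work.

```python
import numpy as np

def windowed_features(signal, win=128, step=64):
    """Sketch of sliding-window feature extraction for one sensor axis:
    each half-overlapping window is summarized by a few time-domain
    statistics, producing one feature row per window."""
    feats = []
    for start in range(0, len(signal) - win + 1, step):
        w = signal[start:start + win]
        feats.append([w.mean(), w.std(), w.min(), w.max(),
                      np.mean(np.abs(np.diff(w)))])  # mean absolute delta
    return np.array(feats)
```

The resulting rows are what feature selection and the classifier stages of a HAR pipeline operate on.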
Cross-domain personalized recommendation method based on scoring reliability
QU Liping, WU Jiaxi
Journal of Computer Applications    2018, 38 (11): 3081-3083.   DOI: 10.11772/j.issn.1001-9081.2018041390
In cross-domain recommendation systems, some users score their purchased items arbitrarily. Since the number of such users is relatively small, their random scorings have little influence on the recommendation effect when the total number of scorings for the items they purchased is large; when that total is small, however, random scorings have a much greater impact. To solve this problem, a cross-domain personalized recommendation method based on scoring reliability was proposed. Different thresholds were set for users according to their scoring reliability. When migrating scores from the auxiliary domain to the target domain, a user's scorings of an item were migrated only if the total number of scorings for that item was not lower than the user's threshold, thereby reducing the influence of random scorings on the recommendation effect. The experimental results show that setting personalized thresholds according to scoring reliability outperforms both setting a uniform threshold for all users and setting no threshold at all.
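The per-user threshold filter can be sketched as below; all names (`aux_ratings`, `rating_counts`, `user_threshold`) are illustrative, and how the reliability thresholds themselves are computed is left out.

```python
def migrate_ratings(aux_ratings, rating_counts, user_threshold):
    """Sketch of the migration rule: a rating from the auxiliary domain is
    migrated to the target domain only when the item's total rating count
    reaches that user's reliability threshold."""
    migrated = []
    for user, item, score in aux_ratings:
        if rating_counts.get(item, 0) >= user_threshold.get(user, 0):
            migrated.append((user, item, score))
    return migrated
```

Ratings of thinly-rated items by unreliable users are dropped, which is exactly the case where a random score would distort the target-domain model most.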
Research on anti-attack capability of the software-defined network control plane based on Byzantine fault tolerance
GAO Jie, WU Jiangxing, HU Yuxiang, LI Junfei
Journal of Computer Applications    2017, 37 (8): 2281-2286.   DOI: 10.11772/j.issn.1001-9081.2017.08.2281
The centralized control plane of Software-Defined Network (SDN) brings great convenience, but also introduces many security risks. In view of the single point of failure, unknown vulnerabilities and back doors, static configuration and other security problems of the controller, a secure SDN architecture based on the Byzantine protocol was proposed, in which the Byzantine protocol was executed among controllers, each switching device was controlled by a controller view, and control messages were decided jointly by several controllers. Furthermore, dynamics and heterogeneity were introduced into the proposed architecture to break the attack chain and enhance the capability of active network defense; moreover, based on a quantification of controller heterogeneity, a two-stage algorithm was designed to seek the controller view, ensuring both the availability of the network and the security of the controller view. Simulation results show that the proposed architecture is more resistant to attacks than the traditional one.
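The multi-controller decision on control messages can be sketched as a Byzantine-style quorum vote; the `n >= 3f + 1` sizing and the `2f + 1` quorum are the standard Byzantine bounds, and the paper's actual protocol details are not reproduced here.

```python
from collections import Counter

def decide_control_message(replies):
    """Sketch of a switch-side decision step: accept the control message
    returned by a quorum of the controllers in its view, tolerating up to
    f Byzantine controllers out of n >= 3f + 1."""
    msg, count = Counter(replies).most_common(1)[0]
    n = len(replies)
    f = (n - 1) // 3                           # max tolerable faulty controllers
    return msg if count >= 2 * f + 1 else None  # need a 2f+1 quorum to agree
```

A single lying controller can therefore never push a bogus flow rule onto a switch, because it cannot assemble a quorum on its own.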
Parallel trajectory compression method based on MapReduce
WU Jiagao, XIA Xuan, LIU Linfeng
Journal of Computer Applications    2017, 37 (5): 1282-1286.   DOI: 10.11772/j.issn.1001-9081.2017.05.1282
The massive spatiotemporal trajectory data caused by the increasing number of Global Positioning System (GPS)-enabled devices is a heavy burden to store, transmit and process, and many trajectory compression methods have been proposed to reduce it. A parallel trajectory compression method based on MapReduce was proposed in this paper. To solve the loss of correlation near segmentation points caused by parallelization, the trajectory was first divided by two segmentation methods whose segmentation points interleave. Then, the trajectory segments were assigned to different nodes for parallel compression. Lastly, the compression results were matched and merged. The performance test and analysis results show that the proposed method not only increases compression efficiency significantly, but also eliminates the error caused by the loss of correlation near segmentation points.
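The interleaved double segmentation can be sketched as follows: the second partition is offset by half a segment, so every internal cut point of one partition falls strictly inside a segment of the other, which is what lets the merge step repair correlation around cut points. The segment length is an assumed parameter.

```python
def interleaved_segments(n, seg_len):
    """Sketch of interleaved double segmentation of a trajectory of n
    points: partition B is offset by seg_len // 2 relative to partition A,
    so neither partition's cut points coincide with the other's."""
    cuts_a = list(range(0, n, seg_len)) + [n]
    half = seg_len // 2
    cuts_b = [0] + list(range(half, n, seg_len)) + [n]
    seg_a = list(zip(cuts_a[:-1], cuts_a[1:]))
    seg_b = list(zip(cuts_b[:-1], cuts_b[1:]))
    return seg_a, seg_b
```

Each segment list would then be compressed in parallel (the map step) and the two compressed results matched and merged (the reduce step).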
Secure transmission method of mission planning system in white-box attack context
CUI Xining, DONG Xingting, MU Ming, WU Jiao
Journal of Computer Applications    2017, 37 (2): 483-487.   DOI: 10.11772/j.issn.1001-9081.2017.02.0483
Concerning the problem that communication keys in the transmission of a mission planning system are easily stolen in the White-Box Attack Context (WBAC), a new secure transmission method for mission planning systems was proposed based on modified white-box Advanced Encryption Standard (white-box AES). First, the AES was split into many lookup tables and the keys were embedded into these tables, and the lookup tables were then merged in accordance with the executing order of the AES. Secondly, on the ground, different white-box AES programs were generated with different keys in accordance with the given white-box AES generation algorithm. In the end, the white-box AES programs were embedded in the secure transmission of the mission planning system; when the key needed to be replaced, the original white-box AES program was erased on the ground and a new one was generated. Theoretical analysis shows that compared with the traditional secure transmission of mission planning systems, the modified method raises the attack complexity to 2^91, which achieves sufficient security and protects the communication key.
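The key-embedding step common to white-box designs can be illustrated with a toy table: a round-key XOR and an S-box lookup are fused into one precomputed table, so the key itself never appears in memory at run time. The 4-bit S-box below is illustrative only (it is not the AES S-box), and real white-box AES additionally protects such tables with random input/output encodings.

```python
# toy 4-bit S-box (illustrative only, not the AES S-box)
SBOX = [0x6, 0x4, 0xC, 0x5, 0x0, 0x7, 0x2, 0xE,
        0x1, 0xF, 0x3, 0xD, 0x8, 0xA, 0x9, 0xB]

def make_tbox(key_nibble):
    """Sketch of key embedding: AddRoundKey and the S-box lookup are fused
    into one table T with T[x] = S[x ^ k], so only T is shipped and the
    key nibble k is never materialized at lookup time."""
    return [SBOX[x ^ key_nibble] for x in range(16)]

tbox = make_tbox(0x9)   # looking up tbox[x] computes S(x XOR 0x9) without exposing 0x9
```

Replacing the key then means regenerating and redeploying the tables, which matches the erase-and-regenerate key replacement described in the abstract.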
Moving object detection method based on multi-information dynamic fusion
HE Wei, QI Qi, ZHANG Guoyun, WU Jianhui
Journal of Computer Applications    2016, 36 (8): 2306-2310.   DOI: 10.11772/j.issn.1001-9081.2016.08.2306
Aiming at the problems that moving object detection based on visual saliency fuses spatio-temporal information too simply and ignores motion information, a moving object detection method based on the dynamic fusion of visual saliency and motion information was proposed. Firstly, the local and global saliencies of each pixel were computed from spatial characteristics extracted from the image, and the spatial saliency map was obtained by combining them with the Bayesian criterion. Secondly, with the help of a structured random forest, motion boundaries were predicted to roughly locate the moving objects, from which the motion boundary map was built. Then, the optimal fusion weights were determined dynamically according to the changes of the spatial saliency map and the motion boundary map. Finally, moving objects were computed and marked using the dynamic fusion weights. The proposed approach inherits the advantages of both the saliency and motion boundary algorithms while overcoming their disadvantages. In comparison experiments with the traditional background subtraction method and the three-frame difference method, the detection rate and the false alarm rate of the proposed approach are improved by up to more than 40%. Experimental results show that the proposed method can detect moving objects accurately and completely, with improved adaptation to the scene.
Relevance model estimation based on stable semantic clustering
SUN Xinyu, WU Jiang, PU Qiang
Journal of Computer Applications    2016, 36 (5): 1313-1318.   DOI: 10.11772/j.issn.1001-9081.2016.05.1313
To solve the problem that relevance models estimated from unstable clustering harm retrieval performance, a new Stable Semantic Relevance Model (SSRM) was proposed. The feedback dataset was first formed from the top N documents returned for the user's initial query; after the stable number of semantic clusters had been detected, SSRM was estimated from the stable semantic clusters with the highest user-query similarity. Finally, the retrieval performance of SSRM was verified by experiments. Compared with the Relevance Model (RM), the Semantic Relevance Model (SRM) and the clustering-based retrieval methods including the Cluster-Based Document Model (CBDM), the LDA-Based Document Model (LBDM) and Resampling, SSRM improves MAP by at least 32.11%, 0.41%, 23.64%, 19.59% and 8.03%, respectively. The experimental results show that retrieval performance benefits from SSRM.
Vulnerability detection algorithm of DOM XSS based on dynamic taint analysis
LI Jie, YU Yan, WU Jiashun
Journal of Computer Applications    2016, 36 (5): 1246-1249.   DOI: 10.11772/j.issn.1001-9081.2016.05.1246
Concerning Document Object Model (DOM)-based Cross Site Scripting (XSS), i.e. DOM XSS, vulnerability detection in the Web client, a detection algorithm for DOM XSS vulnerabilities based on dynamic taint analysis was proposed. By constructing a DOM model and modifying the Firefox SpiderMonkey script engine, a bytecode-level dynamic taint analysis method was used to detect DOM XSS vulnerabilities. First, taint data was marked by extending the attributes of the DOM object classes and modifying the string encoding format of SpiderMonkey. Then, the execution route of the bytecode was traversed to generate the tainted data set. After that, all output points which might trigger DOM XSS attacks were monitored to determine whether the application contained DOM XSS vulnerabilities. In the experiment, a DOM XSS vulnerability detection system containing a crawler was designed and implemented. The experimental results show that the proposed algorithm can effectively detect DOM XSS vulnerabilities, with a detection rate of about 92%.
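The taint-marking and sink-checking idea can be sketched at the string level; a subclass flag stands in here for the modified SpiderMonkey string encoding, and all names are illustrative.

```python
class Tainted(str):
    """Sketch of taint marking: a type-level flag plays the role of the
    extended string encoding used to mark attacker-controlled data."""

def taint_concat(a, b):
    """Taint propagates through string operations: the result is tainted
    if either operand was."""
    out = a + b
    return Tainted(out) if isinstance(a, Tainted) or isinstance(b, Tainted) else out

def check_sink(value):
    """Sketch of an output-point monitor (e.g. a document.write-like sink):
    flag the call when it receives tainted data."""
    return isinstance(value, Tainted)
```

Data read from an attacker-controllable source (e.g. the URL fragment) would be wrapped as `Tainted`, and a sink receiving anything still tainted marks a potential DOM XSS flow.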
Human activity pattern recognition based on block sparse Bayesian learning
WU Jianning, XU Haidong, LING Yun, WANG Jiajing
Journal of Computer Applications    2016, 36 (4): 1039-1044.   DOI: 10.11772/j.issn.1001-9081.2016.04.1039
It is difficult for the traditional Sparse Representation Classification (SRC) algorithm to enhance human activity recognition performance because it ignores the correlation structure information hidden in the sparse coefficient vectors of the test sample. To address this problem, a block sparse model-based human activity recognition approach was proposed. Exploiting the inherent sparse block structure in human activity patterns, the recognition problem was cast as a sparse representation-based classification problem. The block sparse Bayesian learning algorithm was used to solve for the optimal sparse representation coefficients of a test sample as a linear combination of the training samples from the same class, and the reconstruction residual of the sparse coefficients was then used to determine the class of the test sample, which effectively improved the recognition rate. The USC-HAD database containing different styles of human daily activity was selected to evaluate the effectiveness of the proposed approach. The experimental results show that the activity recognition rate of the proposed approach reaches 97.86%, about 5% higher than that of traditional human activity recognition methods. These results demonstrate that the proposed method can effectively capture the discriminative information of different activity patterns and significantly improve the accuracy of human activity recognition.
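The residual-based decision rule can be sketched as follows; ordinary least squares stands in here for the block sparse Bayesian solver, so this shows only the classify-by-class-residual step, not BSBL itself.

```python
import numpy as np

def residual_classify(D, labels, y):
    """Sketch of residual-rule classification: after coding y over the
    training dictionary D (columns = training samples, least squares
    standing in for the BSBL solver), assign y to the class whose columns
    reconstruct it with the smallest residual."""
    coef, *_ = np.linalg.lstsq(D, y, rcond=None)
    best, best_r = None, np.inf
    for c in set(labels):
        mask = np.array([l == c for l in labels])
        coef_c = np.where(mask, coef, 0.0)      # keep only class-c coefficients
        r = np.linalg.norm(y - D @ coef_c)
        if r < best_r:
            best, best_r = c, r
    return best
```

A test sample well explained by one class's training columns yields a near-zero residual for that class and a large one for the others.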
Block sparse Bayesian learning algorithm for reconstruction and recognition of gait pattern from wireless body area networks
WU Jianning, XU Haidong
Journal of Computer Applications    2015, 35 (5): 1492-1498.   DOI: 10.11772/j.issn.1001-9081.2015.05.1492

In order to achieve optimal performance in gait pattern recognition and in the reconstruction of non-sparse acceleration data for Wireless Body Area Network (WBAN)-based telemonitoring, a novel approach applying the Block Sparse Bayesian Learning (BSBL) algorithm to improve the reconstruction of non-sparse accelerometer data was proposed, which contributes to superior gait pattern recognition performance. Its basic idea is that, within the Compressed Sensing (CS) framework of WBAN-based telemonitoring, the original acceleration data acquired at a sensor node in the WBAN is compressed only by a sparse measurement matrix (a simple linear projection), and the compressed data is transmitted to the remote terminal, where the BSBL algorithm recovers the non-sparse acceleration data, modeled as block-structured, by exploiting intra-block correlation, enabling further gait pattern recognition with high accuracy. Acceleration data from the open USC-HAD database, including walking, running, jumping, going upstairs and going downstairs, were employed to test the effectiveness of the proposed method. The experimental results show that the reconstruction performance of the BSBL algorithm on acceleration data significantly outperforms conventional CS algorithms designed for sparse data, and a best accuracy of 98% is obtained by a BSBL-based Support Vector Machine (SVM) classifier for gait pattern recognition. These results demonstrate that the proposed method not only significantly improves the reconstruction of non-sparse acceleration data for accurate gait pattern recognition, but is also helpful for designing low-cost sensor node hardware with lower energy consumption, making it a potential approach for energy-efficient WBAN-based telemonitoring of human gait.
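The sensor-node side of this CS framework is deliberately trivial, which is where the energy saving comes from: the node only multiplies each frame by a fixed measurement matrix, and all the expensive BSBL recovery happens at the remote terminal. A sketch (the frame length, compressed length, and Gaussian measurement matrix are assumptions; the paper uses a sparse matrix):

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 256, 96                        # frame length and compressed length (assumed)
phi = rng.standard_normal((m, n))     # fixed measurement matrix; a sparse binary one also works

x = np.sin(np.arange(n) / 5.0)        # stand-in for one acceleration frame
y = phi @ x                           # the only computation at the sensor node;
                                      # y is sent over the WBAN, BSBL recovers x remotely
```

Here 256 samples shrink to 96 transmitted values, a compression ratio of 0.375, at the cost of a single matrix-vector product on the node.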

Hybrid trajectory compression algorithm based on multiple spatiotemporal characteristics
WU Jiagao, QIAN Keyu, LIU Min, LIU Linfeng
Journal of Computer Applications    2015, 35 (5): 1209-1212.   DOI: 10.11772/j.issn.1001-9081.2015.05.1209

In view of the problem of how to reduce the storage space of trajectory data and improve the speed of data analysis and transmission in the Global Positioning System (GPS), a hybrid trajectory compression algorithm based on multiple spatiotemporal characteristics was proposed in this paper. On the one hand, a new online trajectory compression strategy based on multiple spatiotemporal characteristics was adopted to choose characteristic points more accurately, using the position, direction and speed information of each GPS point. On the other hand, a hybrid compression strategy combining online compression with batched compression was used, with the Douglas batched compression algorithm performing a second compression pass. The experimental results show that the compression error of the new online strategy decreases significantly, although the compression ratio falls slightly compared with the existing spatiotemporal compression algorithm; by choosing an appropriate batching cycle time, both the compression ratio and the compression error of the proposed algorithm improve over the existing spatiotemporal compression algorithm.
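The batch pass can be sketched with the classic Douglas-Peucker recursion (assuming 2-D points and a distance tolerance `tol`; the online pass and its direction/speed tests are not shown):

```python
def douglas_peucker(points, tol):
    """Sketch of Douglas-Peucker batch compression: keep the point farthest
    from the chord between the endpoints whenever its distance exceeds tol,
    and recurse on both halves."""
    def point_line_dist(p, a, b):
        (px, py), (ax, ay), (bx, by) = p, a, b
        dx, dy = bx - ax, by - ay
        if dx == 0 and dy == 0:
            return ((px - ax) ** 2 + (py - ay) ** 2) ** 0.5
        return abs(dy * px - dx * py + bx * ay - by * ax) / (dx * dx + dy * dy) ** 0.5
    if len(points) < 3:
        return list(points)
    a, b = points[0], points[-1]
    idx, dmax = 0, 0.0
    for i in range(1, len(points) - 1):
        d = point_line_dist(points[i], a, b)
        if d > dmax:
            idx, dmax = i, d
    if dmax <= tol:
        return [a, b]                     # the whole run collapses to its chord
    left = douglas_peucker(points[:idx + 1], tol)
    right = douglas_peucker(points[idx:], tol)
    return left[:-1] + right              # drop the duplicated split point
```

Near-straight runs collapse to their endpoints while sharp turns survive, which is why it pairs well with a cheaper online first pass.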

Malware behavior assessment system based on support vector machine
OUYANG Boyu, LIU Xin, XU Chan, WU Jian, AN Xiao
Journal of Computer Applications    2015, 35 (4): 972-976.   DOI: 10.11772/j.issn.1001-9081.2015.04.0972

Aiming at the low classification accuracy of malware behavior analysis systems, a malware classification method based on Support Vector Machine (SVM) was proposed. First, a risk behavior library using software behavior results as characteristics was established manually. Then all software behaviors were captured and matched against the risk behavior library, and the matching results were converted by a conversion algorithm into data suitable for SVM training. For the selection of the SVM model, kernel function and parameters (C, g), a method combining grid search and a Genetic Algorithm (GA) was used after theoretical analysis. A malware behavior assessment system based on the SVM classification model was designed to verify the effectiveness of the proposed classification method. The experiments show that the false positive rate and false negative rate of the system are 5.52% and 3.04% respectively, outperforming K-Nearest Neighbor (KNN) and Naive Bayes (NB); its performance is on a par with the BP neural network, while having higher training and classification efficiency.
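The matching-and-conversion step can be sketched as a fixed-length 0/1 encoding against the risk behavior library; the behavior names below are illustrative.

```python
def to_feature_vector(captured, risk_library):
    """Sketch of the conversion step: captured behaviors are matched against
    the risk behavior library and turned into a fixed-length binary vector
    suitable for SVM training (one slot per library entry)."""
    captured = set(captured)
    return [1 if b in captured else 0 for b in risk_library]
```

Every sample is thereby encoded over the same feature axes regardless of how many behaviors it exhibited, which is what the SVM requires.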

Face recognition algorithm based on low-rank matrix recovery and collaborative representation
HE Linzhi, ZHAO Jianmin, ZHU Xinzhong, WU Jianbin, YANG Fan, ZHENG Zhonglong
Journal of Computer Applications    2015, 35 (3): 779-782.   DOI: 10.11772/j.issn.1001-9081.2015.03.779

Since face images may not be over-complete and may be corrupted by noise under different viewpoints or lighting conditions, an efficient and effective method for Face Recognition (FR) was proposed, namely Robust Principal Component Analysis with Collaborative Representation based Classification (RPCA_CRC). Firstly, the face training dictionary D0 was decomposed into a low-rank matrix D and a sparse error matrix E. Secondly, the test image was collaboratively represented over the low-rank matrix D. Finally, the test image was classified by its reconstruction error. Compared with Sparse Representation based Classification (SRC), RPCA_CRC is on average 25 times faster; meanwhile, its recognition rate increases by 30% with fewer training images. The experimental results show that the proposed method is fast, effective and accurate.
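The speed advantage of collaborative representation over SRC comes from its closed-form ridge solution: one linear solve replaces an iterative l1 minimization. A sketch (the regularization weight `lam` is an assumption):

```python
import numpy as np

def crc_code(D, y, lam=0.01):
    """Sketch of the collaborative-representation coding step: solve the
    ridge-regularized least squares min ||y - D c||^2 + lam ||c||^2 in
    closed form via (D'D + lam I) c = D'y."""
    return np.linalg.solve(D.T @ D + lam * np.eye(D.shape[1]), D.T @ y)
```

Since `(D.T @ D + lam I)^-1 @ D.T` depends only on the dictionary, it can even be precomputed once and reused for every test image, which is where the reported speed-up over SRC comes from.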

Multi-feature based descriptions for automated grading on breast histopathology
GONG Lei, XU Jun, WANG Guanhao, WU Jianzhong, TANG Jinhai
Journal of Computer Applications    2015, 35 (12): 3570-3575.   DOI: 10.11772/j.issn.1001-9081.2015.12.3570
In order to assist in the fast and efficient diagnosis of breast cancer and provide prognosis information for pathologists, a computer-aided diagnosis approach for automatically grading breast pathological images was proposed. In the proposed algorithm, cells in pathological images were first detected automatically by a deep convolutional neural network and a sliding window. Then, color separation based on sparse non-negative matrix factorization, marker-controlled watershed and ellipse fitting were integrated to obtain the boundary of each cell. A total of 203 image-derived features, including architectural features of the tumor and texture and shape features of epithelial cells, were extracted from the pathological images based on the detected cells and the fitted boundaries. A Support Vector Machine (SVM) classifier was trained on the extracted features to realize automated grading of pathological images. To verify the proposed algorithm, 49 Hematoxylin & Eosin (H&E)-stained breast pathological images obtained from 17 patients were considered. The experimental results show that, over 100 ten-fold cross-validation trials, the features describing cell shape and the spatial structure of tissue organization successfully distinguish test samples of low, intermediate and high grade with a classification accuracy of 90.20%; moreover, the proposed algorithm distinguishes high-grade, intermediate-grade and low-grade patients with accuracies of 92.87%, 82.88% and 93.61%, respectively. Compared with methods using only texture features or only architectural features, the proposed algorithm has higher accuracy; it grades pathological images accurately, with small differences in accuracy between grades.
Blind separation method for source signals with temporal structure based on second-order statistics
QIU Mengmeng, ZHOU Li, WANG Lei, WU Jianqiang
Journal of Computer Applications    2014, 34 (9): 2510-2513.   DOI: 10.11772/j.issn.1001-9081.2014.09.2510

The objective of Blind Source Separation (BSS) is to restore unobservable source signals from their mixtures without prior knowledge of the mixing process. The potential source signals are considered to be spatially uncorrelated but temporally correlated, i.e. they have non-vanishing temporal structure. A second-order-statistics-based BSS method was proposed for such sources. Robust prewhitening was first performed on the observed mixed signals, with the dimension of the sources estimated by the Minimum Description Length (MDL) criterion. Then, blind separation was realized by applying Singular Value Decomposition (SVD) to the time-delayed covariance matrix of the whitened signals. A simulation separating a group of speech signals proves the effectiveness of the algorithm, with performance measured by the Signal-to-Interference Ratio (SIR) and the Performance Index (PI).
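The whiten-then-diagonalize pipeline can be sketched with a single time delay; the robust prewhitening and MDL-based dimension estimation are omitted, and an eigendecomposition of the symmetrized delayed covariance stands in for the SVD step.

```python
import numpy as np

def second_order_separate(X, tau=1):
    """Sketch of second-order BSS: whiten the mixtures, then diagonalize a
    time-delayed covariance of the whitened signals to find the unmixing
    rotation. Works when the sources have distinct autocorrelations at
    lag tau; outputs are recovered up to order, sign and scale."""
    X = X - X.mean(axis=1, keepdims=True)
    # prewhitening: make the zero-lag covariance the identity
    C0 = X @ X.T / X.shape[1]
    d, E = np.linalg.eigh(C0)
    W = E @ np.diag(1.0 / np.sqrt(d)) @ E.T
    Z = W @ X
    # symmetrized time-delayed covariance of the whitened signals
    Ct = Z[:, :-tau] @ Z[:, tau:].T / (Z.shape[1] - tau)
    Ct = (Ct + Ct.T) / 2
    _, U = np.linalg.eigh(Ct)
    return U.T @ Z
```

After whitening, the delayed covariance is (approximately) an orthogonal congruence of a diagonal matrix of lag-`tau` source autocorrelations, so its eigenvectors recover the remaining rotation.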

Bearing fault diagnosis method based on dual singular value decomposition and least squares support vector machine
LI Kui, FAN Yugang, WU Jiande
Journal of Computer Applications    2014, 34 (8): 2438-2441.   DOI: 10.11772/j.issn.1001-9081.2014.08.2438

In order to solve the difficult problem that the varying number of singular values produced by Singular Value Decomposition (SVD) of different signals affects the accuracy of fault identification, a fault diagnosis method based on dual SVD and Least Squares Support Vector Machine (LS-SVM) was put forward. The proposed method adaptively chooses the effective singular values, using the curvature spectrum of the singular values, to reconstruct the signal. SVD is then carried out again to acquire the same number of orthogonal components, whose energy entropy is calculated to construct the feature vector. Finally, the feature vector is fed to the LS-SVM classification model for fault identification. Compared with the method using a fixed number of principal singular values as the feature vector, the results show that the proposed method improves the accuracy of bearing fault diagnosis by 13.34%, and that it is feasible and valid.

Implementation algorithm of spherical screen projection system via internal projection
CHEN Ke, WU Jianping
Journal of Computer Applications    2014, 34 (3): 810-814.   DOI: 10.11772/j.issn.1001-9081.2014.03.0810

Addressing the issue of computer processing in internal spherical screen projection, an internal spherical screen projection algorithm based on virtual spherical transform and virtual fisheye lens mapping was proposed. Concerning the output distortion caused by irregular fisheye projection, a sextic-polynomial distortion correction algorithm based on the equal-solid-angle mapping function was presented to approximate any fisheye mapping function and eliminate the distortion; the six coefficients of the polynomial can be obtained by solving a linear algebraic equation. The experimental results show this method is able to completely eradicate the spherical screen projection distortion. Addressing the change in illumination distribution stemming from spherical screen projection, an illumination correction algorithm based on the cosine of the projection angles was also proposed; the experimental results show that it successfully recovers the severely altered illumination distribution to one almost identical to the original picture. This algorithm has theoretical significance and practical application value for the design and software development of spherical projection systems.
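The coefficient-fitting step can be sketched as a linear problem in the monomial basis; least squares over sampled (angle, radius) pairs stands in here for the paper's exact linear system, and the constant term is dropped on the assumption that the mapping passes through the origin.

```python
import numpy as np

def fit_distortion_poly(theta, r, degree=6):
    """Sketch of fitting a sextic correction polynomial r(theta) = c1*theta
    + c2*theta^2 + ... + c6*theta^6: build the monomial design matrix and
    solve the resulting linear system for the six coefficients."""
    A = np.vander(theta, degree + 1, increasing=True)[:, 1:]  # theta^1 .. theta^6
    coef, *_ = np.linalg.lstsq(A, r, rcond=None)
    return coef
```

Because the model is linear in the coefficients, any fisheye mapping sampled at a handful of angles reduces to one small linear solve, matching the abstract's claim that the six coefficients come from a linear algebraic equation.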

Sentiment analysis on Web financial text based on semantic rules
WU Jiang, TANG Chang-jie, LI Taiyong, CUI Liang
Journal of Computer Applications    2014, 34 (2): 481-485.  
In order to effectively improve the accuracy of sentiment orientation and intensity analysis of unstructured Web financial text, a sentiment analysis algorithm for Web financial text based on semantic rules (SAFT-SR) was proposed. The algorithm extracts features of financial text based on Apriori, constructs a financial sentiment lexicon and semantic rules to recognize sentiment units and their intensity, and computes the sentiment orientation and intensity of the text. Experimental results demonstrate that SAFT-SR is a promising algorithm for sentiment analysis of financial text: compared with Ku's algorithm, SAFT-SR achieves better classification performance in sentiment orientation classification, increasing F-measure, recall and precision, and in sentiment intensity analysis it reduces the error, coming closer to the expert annotation.
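The rule-based scoring idea can be sketched as below; the English negator/intensifier lists and the toy lexicon are illustrative stand-ins for the paper's financial sentiment lexicon and semantic rules.

```python
NEGATORS = {"not", "no", "never"}
INTENSIFIERS = {"very": 2.0, "slightly": 0.5}
LEXICON = {"rise": 1.0, "gain": 1.0, "loss": -1.0, "drop": -1.0}

def score_sentence(words):
    """Sketch of rule-based sentiment scoring: lexicon polarity is modified
    by negation (sign flip) and intensity (weight) rules, then summed into
    an orientation-plus-intensity score for the sentence."""
    score, weight, flip = 0.0, 1.0, 1
    for w in words:
        if w in NEGATORS:
            flip = -flip
        elif w in INTENSIFIERS:
            weight *= INTENSIFIERS[w]
        elif w in LEXICON:
            score += flip * weight * LEXICON[w]
            weight, flip = 1.0, 1      # modifiers bind to the next sentiment word only
    return score
```

The sign of the total gives the orientation and its magnitude the intensity, which is the shape of output the abstract describes.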
Reconfigurable hybrid task scheduling algorithm
SHEN Dhu, ZHU Zhiyu, WU Jiang
Journal of Computer Applications    2014, 34 (2): 387-390.  
An important concern in reconfigurable task scheduling is how to hide and reduce the configuration time. A reconfigurable hybrid task scheduling algorithm was proposed to solve the problem that a hybrid task involves both software and hardware simultaneously. The tasks and their chronological order are first determined by means of pre-configuration and a priority algorithm, and the configuration of each successive task is then hidden within the run time of its predecessor. Meanwhile, a configuration reuse strategy is adopted to reduce the number of configurations for identical tasks. Compared with existing algorithms, the new algorithm is more effective and has lower cost.
Related Articles | Metrics
Sparse Bayesian learning for credit risk evaluation
LI Taiyong, WANG Huijun, WU Jiang, ZHANG Zhilin, TANG Changjie
Journal of Computer Applications    2013, 33 (11): 3094-3096.  
Abstract849)      PDF (609KB)(426)       Save
To address the low classification accuracy and the poor interpretability of selected features in traditional credit risk evaluation, a new model using Sparse Bayesian Learning (SBL) to evaluate personal credit risk (SBLCredit) was proposed. SBLCredit utilized the advantages of SBL to obtain solutions as sparse as possible under prior knowledge on the feature weights, which led to both good classification performance and effective feature selection. On real-world German and Australian credit datasets, SBLCredit improved the classification accuracy by 4.52%, 6.40%, 6.26% and 2.27% on average compared with the state-of-the-art K-Nearest Neighbour (KNN), Naive Bayes, decision tree and support vector machine respectively. The experimental results demonstrate that the proposed SBLCredit is a promising method for credit risk evaluation with higher accuracy and fewer features.
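The sparsity mechanism behind SBL can be sketched with the standard evidence re-estimation loop (Tipping-style relevance determination) on a toy regression problem; the paper's actual classification formulation and priors are not reproduced here:

```python
import numpy as np

def sbl_fit(X, t, iters=50, prune=1e6):
    """Minimal Sparse Bayesian Learning for a linear model t ~ X w.
    A per-feature precision alpha_i grows without bound for irrelevant
    features, driving their weights to zero (pruning)."""
    n, d = X.shape
    alpha, beta = np.ones(d), 1.0
    w = np.zeros(d)
    for _ in range(iters):
        keep = alpha < prune                  # active (unpruned) features
        Xa = X[:, keep]
        Sigma = np.linalg.inv(np.diag(alpha[keep]) + beta * Xa.T @ Xa)
        mu = beta * Sigma @ Xa.T @ t          # posterior mean of the weights
        gamma = 1 - alpha[keep] * np.diag(Sigma)
        alpha[keep] = gamma / (mu ** 2 + 1e-12)   # evidence re-estimation
        resid = t - Xa @ mu
        beta = (n - gamma.sum()) / (resid @ resid + 1e-12)
        w = np.zeros(d)
        w[keep] = mu
    return w

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 10))
w_true = np.zeros(10)
w_true[[1, 4]] = [2.0, -3.0]                  # only two relevant features
t = X @ w_true + 0.05 * rng.standard_normal(100)
w = sbl_fit(X, t)
assert np.allclose(w, w_true, atol=0.1)       # relevant weights recovered
assert (np.abs(w) > 0.5).sum() == 2           # irrelevant features suppressed
```

The same pruning behaviour is what yields the "fewer features" interpretability claimed for SBLCredit.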
Related Articles | Metrics
New disparity estimation method for multiview video based on Mean Shift
HU Bo, DAI Wanchang, XIAO Zhijian, WU Jianping, HU Jie
Journal of Computer Applications    2013, 33 (08): 2297-2299.  
Abstract573)      PDF (620KB)(403)       Save
Concerning the high computational complexity of disparity estimation in multiview video encoding, a new disparity estimation method based on Mean Shift was proposed. The relationship between the disparity vector and the motion vector in the spatio-temporal domain was analyzed, and a predicted disparity vector was calculated. The initial search position for disparity matching was taken as the initial value of the Mean Shift iteration, yielding the optimal macroblock match in the reference frame. The experimental results show that the proposed method saves more than 94% of the encoding time with a negligible drop in rate-distortion performance compared with the full search algorithm. Compared with the fast search algorithm, this method saves more than 10% of the encoding time and improves rate-distortion performance.
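The core Mean Shift iteration used for the search can be sketched in isolation: starting from an initial position (here standing in for the predicted disparity vector), the point is repeatedly moved to the kernel-weighted mean of nearby samples until it settles on the local mode. The synthetic 2D samples below are an assumption for illustration, not the paper's matching costs:

```python
import numpy as np

def mean_shift(points, start, bandwidth=1.0, iters=30):
    """Move `start` toward the local density mode by repeatedly replacing
    it with the Gaussian-kernel-weighted mean of the sample points."""
    x = np.asarray(start, dtype=float)
    for _ in range(iters):
        d2 = ((points - x) ** 2).sum(axis=1)
        w = np.exp(-d2 / (2 * bandwidth ** 2))   # Gaussian kernel weights
        x = (w[:, None] * points).sum(axis=0) / w.sum()
    return x

rng = np.random.default_rng(1)
cluster = rng.normal(loc=[5.0, 5.0], scale=0.3, size=(200, 2))
mode = mean_shift(cluster, start=[4.0, 4.0])
assert np.linalg.norm(mode - np.array([5.0, 5.0])) < 0.2
```

A good initial value (the predicted disparity in the abstract) is what keeps the number of iterations, and hence the encoding time, low.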
Related Articles | Metrics
Remote sensing service discovery mechanism based on trusted QoS clustering
YAO Jianhua, WU Jiamin, NIU Wenjia, TONG Endong
Journal of Computer Applications    2013, 33 (02): 587-591.   DOI: 10.3724/SP.J.1087.2013.00587
Abstract1069)      PDF (878KB)(354)       Save
Web service technology has been utilized in the remote sensing area to increase the dynamics and scalability of remote sensing resources. However, the underlying remote sensing data of sensing services are characterized by frequent, large-span changes, while the Quality of Service (QoS) values given by service providers are open and untrusted. These two aspects reduce the efficiency and accuracy of remote sensing service discovery, which poses a new challenge for effective service discovery in remote sensing applications. To resolve these issues, a new remote sensing service discovery mechanism based on trusted QoS clustering was proposed, in which the underlying service response time, the updating frequency and upper-level evaluation standards were used to measure QoS. Based on this QoS measurement, an updating-frequency-adaptive QoS detection and updating mechanism was developed. Through clustering, the proposed approach can increase the QoS credibility of remote sensing services. The experimental results show that the proposed approach improves the efficiency of remote sensing service discovery and user satisfaction.
Related Articles | Metrics
Check valve's fault detection with wavelet packet's kernel principal component analysis
TIAN Ning, FAN Yugang, WU Jiande, HUANG Guoyong, WANG Xiaodong
Journal of Computer Applications    2013, 33 (01): 291-294.   DOI: 10.3724/SP.J.1087.2013.00291
Abstract951)      PDF (599KB)(516)       Save
The high pressure piston diaphragm pump is the most important power source in pipeline transportation. To solve the problem of on-line monitoring of internal piston faults, a detection method based on the wavelet packet frequency bands of the acoustic emission signal and Kernel Principal Component Analysis (KPCA) was put forward. Firstly, wavelet packet decomposition was applied to the acoustic emission data to obtain the energy value of each frequency band. Secondly, KPCA was used to decompose the energies in a high-dimensional space to build the feature model, and the SPE and T2 statistics of the feature model were used to detect fault signals. Finally, experiments were conducted on the acoustic emission statistics of a GEHO diaphragm pump's check valve. Compared with the PCA method, the proposed method achieves fast and accurate on-line monitoring of internal piston faults, so it has good application prospects in the domain of non-destructive fault detection for high pressure piston diaphragm pumps.
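The band-energy feature extraction in the first step can be sketched with a Haar wavelet packet tree; the Haar basis, the depth of 3, and the synthetic test signals are assumptions for illustration (the paper's wavelet choice and the subsequent KPCA SPE/T2 monitoring are not reproduced):

```python
import numpy as np

def haar_packet_energies(x, depth=3):
    """Energy in each leaf band of a full Haar wavelet packet tree."""
    bands = [np.asarray(x, dtype=float)]
    for _ in range(depth):
        nxt = []
        for b in bands:
            a = (b[0::2] + b[1::2]) / np.sqrt(2)   # approximation (low-pass)
            d = (b[0::2] - b[1::2]) / np.sqrt(2)   # detail (high-pass)
            nxt += [a, d]
        bands = nxt
    return np.array([np.sum(b ** 2) for b in bands])

t = np.arange(256)
low = np.sin(2 * np.pi * t / 64)        # low-frequency test signal
high = np.sin(2 * np.pi * 0.4 * t)      # high-frequency test signal
e_low, e_high = haar_packet_energies(low), haar_packet_energies(high)
assert np.argmax(e_low) == 0            # energy concentrated in the lowest band
assert np.argmax(e_high) != 0           # energy lands in a higher band
```

The resulting 8-dimensional energy vectors are the kind of features the abstract then feeds into KPCA for fault monitoring.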
Reference | Related Articles | Metrics
Research on adaptive time-varying terminal sliding mode control
HUANG Guoyong, HU Jichen, WU Jiande, FAN Yugang, WANG Xiaodong
Journal of Computer Applications    2013, 33 (01): 222-225.   DOI: 10.3724/SP.J.1087.2013.00222
Abstract755)      PDF (569KB)(512)       Save
To resolve the poor robustness of Terminal sliding mode control during the reaching phase, a time-varying sliding mode control method was proposed. A nonlinear time-varying sliding mode surface was designed after analyzing the influence of the surface's design parameters on system performance. To deal with the disturbances of a class of Multi-Input Multi-Output (MIMO) nonlinear systems, a disturbance observer was constructed, in which the external disturbances were approximated on-line by adjusting the weights. The simulation results show that the settling time of the proposed scheme is 80% less than that of PID control, and the proposed method exhibits no overshoot. The simulation results demonstrate that the proposed design can be used for the control of MIMO nonlinear systems.
Reference | Related Articles | Metrics
Detection and defense scheme for selective forwarding attacks in wireless sensor network
FU Xiang-yan, LI Ping, WU Jia-ying
Journal of Computer Applications    2012, 32 (10): 2711-2715.   DOI: 10.3724/SP.J.1087.2012.02711
Abstract847)      PDF (956KB)(471)       Save
To improve the detection rate of malicious nodes and the defensive ability of the system, a detection method based on an optimal random routing algorithm and neighbor node monitoring was proposed to counter selective forwarding attacks in Wireless Sensor Networks (WSN). The method created the forwarding path by introducing parameters such as distance and trust degree. At the same time, it used a node monitoring scheme to detect and defend against malicious nodes during route discovery and selection. Simulations were conducted in the Matlab environment and the performance was compared with other methods. The analysis and simulation results show that this method is effective in detecting selective forwarding attacks and can ensure reliable packet delivery to the destination with relatively low energy consumption.
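A randomized, trust- and distance-weighted next-hop choice of the kind the abstract describes can be sketched as a roulette-wheel selection. The neighbor values and the score weights below are illustrative assumptions, not the paper's parameters:

```python
import random

def choose_next_hop(neighbors, weights=(0.6, 0.4), rng=random):
    """Randomized next-hop selection biased by trust degree and by
    progress toward the sink (shorter remaining distance scores higher)."""
    w_trust, w_dist = weights
    max_dist = max(d for _, _, d in neighbors)
    scores = [(node, w_trust * trust + w_dist * (1 - d / max_dist))
              for node, trust, d in neighbors]
    r = rng.uniform(0, sum(s for _, s in scores))  # roulette wheel keeps routes random
    acc = 0.0
    for node, s in scores:
        acc += s
        if acc >= r:
            return node
    return scores[-1][0]

# neighbors: (id, trust degree in [0,1], distance to sink); values assumed.
neighbors = [("a", 0.9, 10.0), ("b", 0.2, 8.0), ("c", 0.8, 12.0)]
random.seed(0)
picks = [choose_next_hop(neighbors) for _ in range(1000)]
assert picks.count("a") > picks.count("b")   # trusted node favored over suspect one
```

Keeping the choice randomized rather than greedy is what prevents a single compromised node from sitting on every forwarding path.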
Reference | Related Articles | Metrics
New color image segmentation algorithm based on level set
CHEN Yuan-tao, XU Wei-hong, WU Jia-ying
Journal of Computer Applications    2012, 32 (03): 749-751.   DOI: 10.3724/SP.J.1087.2012.00749
Abstract908)      PDF (641KB)(562)       Save
Since the functional under consideration is non-convex in its variational form, the computed image segmentation often falls into a local minimum. Based on the active contour model for global vector-valued image segmentation, vector-valued image segmentation and image denoising were integrated into a new variational formulation within a global minimization framework. The new model is easy to construct and computationally cheap. Compared with the classical level set method, repeated re-initialization of the level set function can be avoided. Analyses on artificial and real images verify that the new method produces better segmentation results.
Reference | Related Articles | Metrics
Social emotional optimization algorithm based on quadratic interpolation method
WU Jian-na, CUI Zhi-hua, LIU Jing
Journal of Computer Applications    2011, 31 (09): 2522-2525.   DOI: 10.3724/SP.J.1087.2011.02522
Abstract1264)      PDF (708KB)(384)       Save
Social Emotional Optimization Algorithm (SEOA) is a new swarm intelligence population-based optimization algorithm that simulates human social behaviors. Individual decision-making ability and individual emotion, which affect the optimization results, are taken into account, so its diversity is much better than that of common swarm intelligence algorithms. However, its local search capability needs improvement. The quadratic interpolation method performs well in local search, so introducing it into SEOA can improve the search capability. Tests of the optimization performance on benchmark functions prove that introducing the quadratic interpolation method into SEOA improves the local search ability, and thereby the global search capability.
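The quadratic interpolation operator itself is a one-line formula: fit a parabola through three candidate points and jump to its vertex. A minimal sketch (how SEOA selects the three points is in the paper; here a simple quadratic test function, assumed for illustration, makes the vertex exact):

```python
def quadratic_interpolation(x1, x2, x3, f):
    """Vertex of the parabola through (x_i, f(x_i)), used as a
    local-search refinement step; assumes the points are not collinear."""
    f1, f2, f3 = f(x1), f(x2), f(x3)
    num = (x2**2 - x3**2) * f1 + (x3**2 - x1**2) * f2 + (x1**2 - x2**2) * f3
    den = (x2 - x3) * f1 + (x3 - x1) * f2 + (x1 - x2) * f3
    return 0.5 * num / den

f = lambda x: (x - 1.7) ** 2 + 4        # test function with minimum at x = 1.7
x_new = quadratic_interpolation(0.0, 1.0, 3.0, f)
assert abs(x_new - 1.7) < 1e-9          # vertex recovered exactly for a quadratic
```

In the hybrid algorithm, `x_new` replaces the worst of the three individuals whenever it improves the objective, sharpening local search without disturbing the population's diversity.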
Related Articles | Metrics
Study and implementation of optimization mechanism for hybrid P2P spatial indexing network
WU Jia-gao, SHAO Shi-wei, HUA Zheng, ZOU Zhi-qiang, HU Bin
Journal of Computer Applications    2011, 31 (09): 2301-2304.   DOI: 10.3724/SP.J.1087.2011.02301
Abstract1387)      PDF (615KB)(612)       Save
To address the insufficiency of current P2P Geographic Information Systems (GIS) in utilizing clients' network resources, and based on an analysis and summary of existing hybrid P2P spatial indexing networks, a practical group strategy was proposed. In this strategy, peers with the same spatial data semantics join the same group, in which the query load is shared by the group members. Furthermore, a replacement algorithm for current index nodes and a backup strategy were proposed to improve the query performance and stability of the overall network. The experimental results indicate that the indexing network with the group strategy can effectively make use of clients' network resources and improve query performance under a large number of concurrent queries.
Related Articles | Metrics